fix llm streaming #88
Conversation
🚀 Preview Deployments Ready! Your changes have been deployed to preview environments.
1 issue found across 3 files
Prompt for AI agents (all issues)
Check if these issues are valid — if so, understand the root cause of each and fix them.
<file name="openrouter/server/tools/llm-binding.ts">
<violation number="1" location="openrouter/server/tools/llm-binding.ts:471">
P2: Use `APICallError.isInstance(error)` instead of a custom symbol-based type guard. The AI SDK provides this static method specifically for type checking, and relying on internal symbols (`Symbol.for('vercel.ai.error')`) is fragile and could break with SDK updates.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```ts
const isAPICallError = (error: unknown): error is APICallError =>
  typeof error === "object" &&
  error !== null &&
  Symbol.for("vercel.ai.error") in error &&
  Symbol.for("vercel.ai.error.AI_APICallError") in error;
```
File context:

```diff
@@ -467,6 +468,12 @@ const getUsageFromStream = (
   ];
 };
+const isAPICallError = (error: unknown): error is APICallError =>
+  typeof error === "object" &&
+  error !== null &&
```
Suggested change:

```diff
-const isAPICallError = (error: unknown): error is APICallError =>
-  typeof error === "object" &&
-  error !== null &&
-  Symbol.for("vercel.ai.error") in error &&
-  Symbol.for("vercel.ai.error.AI_APICallError") in error;
+const isAPICallError = (error: unknown): error is APICallError =>
+  APICallError.isInstance(error);
```
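For context, a sketch of the pattern the reviewer recommends. In the real file `APICallError` would be imported from the AI SDK (the `ai` package); here a minimal stand-in class is defined so the example is self-contained, and its `isInstance` is a simplified version of the SDK's internal brand check. The `toErrorResponse` helper and its field choices (`statusCode`, `responseBody`) are illustrative, not taken from the PR.

```typescript
// Hypothetical stand-in for the AI SDK's APICallError; the real class
// carries more fields (url, responseHeaders, isRetryable, ...).
class APICallError extends Error {
  // The SDK exposes a static type check; mimicked here with instanceof.
  static isInstance(error: unknown): error is APICallError {
    return error instanceof APICallError;
  }
  constructor(
    message: string,
    readonly statusCode?: number,
    readonly responseBody?: string,
  ) {
    super(message);
  }
}

// The type guard from the diff, rewritten per the review suggestion:
// delegate to the SDK's own check instead of probing internal symbols.
const isAPICallError = (error: unknown): error is APICallError =>
  APICallError.isInstance(error);

// Illustrative usage: surface the upstream status and body on failure
// instead of letting the stream break silently.
const toErrorResponse = (error: unknown): Response => {
  if (isAPICallError(error)) {
    return new Response(error.responseBody ?? error.message, {
      status: error.statusCode ?? 502,
    });
  }
  return new Response("Internal error", { status: 500 });
};
```

The advantage of `APICallError.isInstance` is that the SDK owns the branding mechanism: if a future release changes how errors are marked, the static method keeps working while a hand-rolled symbol probe silently stops matching.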
Summary by cubic
Fixed real-time LLM streaming for OpenRouter by using the correct provider API and returning proper error responses. This prevents broken streams and surfaces accurate status, headers, and body on failures.
Bug Fixes
Dependencies
Written for commit efc81fb. Summary will update on new commits.